Tomographic Image Reconstruction Based on Minimization of Symmetrized Kullback-Leibler Divergence
Authors
Abstract
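As background for the method named in the title (a standard definition, not quoted from the article): the symmetrized, or Jeffreys, form of the Kullback-Leibler divergence is the sum of the two directed divergences, and in tomographic reconstruction a divergence of this kind is typically minimized between the measured projection data and the forward projection of the image estimate.

```latex
% Standard symmetrized (Jeffreys) form of the Kullback-Leibler divergence;
% background only, not quoted from the article.
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_i p_i \log \frac{p_i}{q_i},
\qquad
J(P, Q) = D_{\mathrm{KL}}(P \,\|\, Q) + D_{\mathrm{KL}}(Q \,\|\, P)
        = \sum_i (p_i - q_i) \log \frac{p_i}{q_i}.
```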
Similar Articles
Model Confidence Set Based on Kullback-Leibler Divergence Distance
Consider the problem of estimating the true density h(·) based upon a random sample X1, …, Xn. In general, h(·) is approximated using an appropriate (in some sense; see below) model fθ(x). This article, using Vuong's (1989) test along with a collection of k (> 2) non-nested models, constructs a set of appropriate models, a so-called model confidence set, for the unknown model h(·). Application of such confide...
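For context, the link between this density-estimation setup and the Kullback-Leibler divergence is the standard one (a textbook relationship, assumed here rather than taken from the abstract): the candidate model closest to h(·) in KL divergence is the one with the largest expected log-likelihood.

```latex
% Standard KL criterion underlying likelihood-based model comparison
% (background assumption, not quoted from the abstract):
D_{\mathrm{KL}}(h \,\|\, f_\theta)
  = \mathbb{E}_h\!\left[\log h(X)\right]
  - \mathbb{E}_h\!\left[\log f_\theta(X)\right],
% so minimizing the divergence over \theta is equivalent to maximizing
% the expected log-likelihood \mathbb{E}_h[\log f_\theta(X)].
```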
Generalized Kullback-Leibler Divergence Minimization within a Scaled Bregman Framework
The generalized Kullback-Leibler divergence (K-Ld) in Tsallis statistics subjected to the additive duality of generalized statistics (dual generalized K-Ld) is reconciled with the theory of Bregman divergences for expectations defined by normal averages, within a measure theoretic framework. Specifically, it is demonstrated that the dual generalized K-Ld is a scaled Bregman divergence. The Pyth...
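As a pointer to the Bregman-divergence machinery the abstract refers to, here is the textbook definition (background only, not taken from the paper):

```latex
% Bregman divergence generated by a strictly convex, differentiable F:
B_F(x, y) = F(x) - F(y) - \langle \nabla F(y),\, x - y \rangle .
% Taking F(x) = \sum_i x_i \log x_i - x_i (negative entropy) gives the
% generalized Kullback-Leibler divergence \sum_i x_i \log(x_i / y_i) - x_i + y_i.
```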
Notes on Kullback-Leibler Divergence and Likelihood
The Kullback-Leibler (KL) divergence is a fundamental equation of information theory that quantifies the proximity of two probability distributions. Although difficult to understand by examining the equation, an intuition and understanding of the KL divergence arises from its intimate relationship with likelihood theory. We discuss how KL divergence arises from likelihood theory in an attempt t...
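A minimal numerical sketch of that quantity follows; the function name and the example distributions are illustrative and not taken from the note.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D_KL(p || q) = sum_i p_i * log(p_i / q_i)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()          # normalize to probability vectors
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# The divergence is the expected log-likelihood ratio under p: it is zero
# only when the two distributions coincide, and grows as q mis-models p.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))   # small positive value
print(kl_divergence(p, p))   # 0.0
```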
Derivation of the PHD and CPHD Filters Based on Direct Kullback-Leibler Divergence Minimization
In this paper, we provide novel derivations of the probability hypothesis density (PHD) and cardinalised PHD (CPHD) filters without using probability generating functionals or functional derivatives. We show that both the PHD and CPHD filters fit in the context of assumed density filtering and implicitly perform Kullback-Leibler divergence (KLD) minimisations after the prediction and update ste...
Rényi Divergence and Kullback-Leibler Divergence
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibl...
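The order-α family mentioned here has a standard closed form, reproduced below as background (not quoted from the paper):

```latex
% Rényi divergence of order \alpha between discrete distributions P and Q:
D_{\alpha}(P \,\|\, Q)
  = \frac{1}{\alpha - 1} \log \sum_i p_i^{\alpha} q_i^{1-\alpha},
  \qquad \alpha > 0,\ \alpha \neq 1,
% and in the limit \alpha \to 1 it recovers the Kullback-Leibler divergence
% D_{\mathrm{KL}}(P \,\|\, Q) = \sum_i p_i \log (p_i / q_i).
```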
Journal
Journal title: Mathematical Problems in Engineering
Year: 2018
ISSN: 1024-123X,1563-5147
DOI: 10.1155/2018/8973131